Wireless Power Transfer Under Kullback-Leibler Distribution Uncertainty: A Mathematical Framework

Authors

Abstract


Related Articles

Improved Minimax Prediction Under Kullback-Leibler Loss

Let X | μ ∼ N_p(μ, v_x I) and Y | μ ∼ N_p(μ, v_y I) be independent p-dimensional multivariate normal vectors with common unknown mean μ, and let p(x | μ) and p(y | μ) denote the conditional densities of X and Y. Based on only observing X = x, we consider the problem of obtaining a predictive distribution p̂(y | x) for Y that is close to p(y | μ) as measured by Kullback-Leibler loss. The natural straw man ...
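For context, the Kullback-Leibler loss referred to in this abstract has the standard form below; this is the textbook definition, not text recovered from the gated article, and the risk notation R is my own shorthand.

```latex
% KL loss of a predictive density \hat{p}(\cdot \mid x) for Y, whose true density is p(\cdot \mid \mu):
L\bigl(\mu, \hat{p}(\cdot \mid x)\bigr)
  = D_{\mathrm{KL}}\!\bigl(p(\cdot \mid \mu) \,\big\|\, \hat{p}(\cdot \mid x)\bigr)
  = \int p(y \mid \mu)\,\log \frac{p(y \mid \mu)}{\hat{p}(y \mid x)}\, dy ,
% and the corresponding risk averages this loss over the observation X:
R(\mu, \hat{p}) = \mathbb{E}_{X \sim N_p(\mu,\, v_x I)}\, L\bigl(\mu, \hat{p}(\cdot \mid X)\bigr).
```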

A Generalized Framework for Kullback-Leibler Markov Aggregation

This paper proposes an information-theoretic cost function for aggregating a Markov chain via a (possibly stochastic) mapping. The cost function is motivated by two objectives: 1) The process obtained by observing the Markov chain through the mapping should be close to a Markov chain, and 2) the aggregated Markov chain should retain as much of the temporal dependence structure of the original M...
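The exact cost function is not visible in this truncated abstract, so the sketch below only illustrates the general setting it describes: aggregating a finite Markov chain through a deterministic mapping `phi` and scoring the aggregation by the KL divergence rate between the original chain and the aggregated chain lifted back to the original state space. The π-weighted lift and all function names are my own illustrative choices, not the paper's construction.

```python
import numpy as np

def stationary(P):
    """Stationary distribution of a row-stochastic matrix P (left Perron eigenvector)."""
    w, v = np.linalg.eig(P.T)
    pi = np.real(v[:, np.argmin(np.abs(w - 1.0))])
    return pi / pi.sum()

def aggregate(P, phi, m):
    """pi-weighted aggregation of P under the deterministic state-to-cluster mapping phi."""
    n, pi = len(P), stationary(P)
    Q = np.zeros((m, m))
    for a in range(m):
        idx_a = [i for i in range(n) if phi[i] == a]
        w = pi[idx_a] / pi[idx_a].sum()          # conditional stationary weights within cluster a
        for b in range(m):
            idx_b = [j for j in range(n) if phi[j] == b]
            Q[a, b] = w @ P[np.ix_(idx_a, idx_b)].sum(axis=1)
    return Q, pi

def kl_rate_to_lifted_aggregation(P, phi, m):
    """KL divergence rate between the original chain and the aggregated chain lifted back to
    the original states (mass inside a cluster redistributed proportionally to pi)."""
    Q, pi = aggregate(P, phi, m)
    n = len(P)
    cmass = np.array([sum(pi[j] for j in range(n) if phi[j] == b) for b in range(m)])
    rate = 0.0
    for i in range(n):
        for j in range(n):
            if P[i, j] > 0:
                lifted = Q[phi[i], phi[j]] * pi[j] / cmass[phi[j]]
                rate += pi[i] * P[i, j] * np.log(P[i, j] / lifted)
    return rate

# Toy example: a 4-state chain with an almost-lumpable 2-block structure.
P = np.array([[0.70, 0.20, 0.05, 0.05],
              [0.25, 0.65, 0.05, 0.05],
              [0.05, 0.05, 0.60, 0.30],
              [0.05, 0.05, 0.35, 0.55]])
print(kl_rate_to_lifted_aggregation(P, phi=[0, 0, 1, 1], m=2))  # close to 0: little structure lost
```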

Generalized Kullback-Leibler Divergence Minimization within a Scaled Bregman Framework

The generalized Kullback-Leibler divergence (K-Ld) in Tsallis statistics subjected to the additive duality of generalized statistics (dual generalized K-Ld) is reconciled with the theory of Bregman divergences for expectations defined by normal averages, within a measure theoretic framework. Specifically, it is demonstrated that the dual generalized K-Ld is a scaled Bregman divergence. The Pyth...
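For context, the standard Bregman divergence and its familiar specialization to the ordinary Kullback-Leibler divergence are reproduced below; the paper's result concerns the Tsallis dual generalized K-Ld, whose precise scaling is only in the full text and is not reproduced here.

```latex
% Bregman divergence generated by a strictly convex, differentiable f:
B_f(p \,\|\, q) = f(p) - f(q) - \langle \nabla f(q),\, p - q \rangle .
% Taking f(p) = \sum_i p_i \log p_i (negative Shannon entropy) gives, for normalized p and q,
B_f(p \,\|\, q) = \sum_i p_i \log \frac{p_i}{q_i} - \sum_i (p_i - q_i) = D_{\mathrm{KL}}(p \,\|\, q).
```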

Kullback-Leibler Boosting

In this paper, we develop a general classification framework called Kullback-Leibler Boosting, or KLBoosting. KLBoosting has the following properties. First, classification is based on the sum of histogram divergences along corresponding global and discriminating linear features. Second, these linear features, called KL features, are iteratively learnt by maximizing the projected Kullback-Leibler d...
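The abstract is truncated, so the snippet below only illustrates the core quantity it mentions: a (here symmetric) Kullback-Leibler divergence between class-conditional histograms of data projected onto a linear feature, which is what a KLBoosting-style feature selection step would try to maximize. The function name, bin count, and the use of the symmetric variant are my own illustrative choices.

```python
import numpy as np

def projected_kl(X_pos, X_neg, w, bins=32, eps=1e-8):
    """Symmetric KL divergence between class-conditional histograms of the data
    projected onto the linear feature w."""
    z_pos, z_neg = X_pos @ w, X_neg @ w
    lo, hi = min(z_pos.min(), z_neg.min()), max(z_pos.max(), z_neg.max())
    h_pos, _ = np.histogram(z_pos, bins=bins, range=(lo, hi))
    h_neg, _ = np.histogram(z_neg, bins=bins, range=(lo, hi))
    p = (h_pos + eps) / (h_pos + eps).sum()   # smoothed class-conditional histograms
    q = (h_neg + eps) / (h_neg + eps).sum()
    return float(np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p)))

# Toy usage: two Gaussian classes; a discriminative direction scores much higher than a random one.
rng = np.random.default_rng(0)
X_pos = rng.normal([2.0, 0.0], 1.0, size=(500, 2))
X_neg = rng.normal([-2.0, 0.0], 1.0, size=(500, 2))
print(projected_kl(X_pos, X_neg, np.array([1.0, 0.0])))   # large: classes separate along this axis
print(projected_kl(X_pos, X_neg, np.array([0.0, 1.0])))   # small: classes overlap along this axis
```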

Kullback-Leibler Divergence for the Normal-Gamma Distribution

We derive the Kullback-Leibler divergence for the normal-gamma distribution and show that it is identical to the Bayesian complexity penalty for the univariate general linear model with conjugate priors. Based on this finding, we provide two applications of the KL divergence, one in simulated and one in empirical data.
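The closed-form divergence is derived in the paper itself; the sketch below is only a Monte Carlo estimator of KL(p‖q) for two normal-gamma distributions, which such a closed form could be checked against numerically. The parameterization (m, kappa, a, b = mean, precision scaling, shape, rate) and the function names are my own notation, not the paper's.

```python
import numpy as np
from scipy import stats

def ng_logpdf(mu, lam, m, kappa, a, b):
    """Log-density of the normal-gamma distribution:
    lambda ~ Gamma(a, rate=b), mu | lambda ~ Normal(m, 1/(kappa*lambda))."""
    return (stats.gamma.logpdf(lam, a, scale=1.0 / b)
            + stats.norm.logpdf(mu, loc=m, scale=1.0 / np.sqrt(kappa * lam)))

def ng_sample(m, kappa, a, b, n, rng):
    """Draw n samples (mu, lambda) from the normal-gamma distribution."""
    lam = rng.gamma(shape=a, scale=1.0 / b, size=n)
    mu = rng.normal(loc=m, scale=1.0 / np.sqrt(kappa * lam))
    return mu, lam

def kl_normal_gamma_mc(p, q, n=200_000, seed=0):
    """Monte Carlo estimate of KL(p || q), with p and q given as (m, kappa, a, b) tuples."""
    rng = np.random.default_rng(seed)
    mu, lam = ng_sample(*p, n, rng)
    return float(np.mean(ng_logpdf(mu, lam, *p) - ng_logpdf(mu, lam, *q)))

print(kl_normal_gamma_mc((0.0, 1.0, 2.0, 1.0), (0.5, 2.0, 3.0, 1.5)))
```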


Journal

Journal title: IEEE Wireless Communications Letters

Year: 2020

ISSN: 2162-2337, 2162-2345

DOI: 10.1109/lwc.2020.2999416